
Irishman

macrumors 68040
Original poster
Nov 2, 2006
3,401
845
So, with Leopard on the horizon comes the word that it will upgrade your OpenGL from 2.0 to 2.1. Good move Apple. So, feature for feature, how well does OpenGL 2.1 fare against DirectX10?

We've all seen the videos of new DX10 games like Crysis and see the jaw-dropping difference some of these advanced features can make:

time of day lighting
real-time ambient lightmaps
dynamic soft shadows
lightbeams
long-range view distance
parallax occlusion mapping
motion blur
depth of field

So, does anyone know if the newest OpenGL can take advantage of these awesome-looking features?

And is it going to be moot, because OpenGL 3.0 is supposed to arrive about 3 months after that?
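
To make that list concrete, here is a minimal sketch of one of those effects, depth of field, written at the OpenGL 2.0 level: a GLSL fragment shader wrapped in C. This is illustrative only; it assumes a valid GL 2.0 context already exists (GLUT, SDL, whatever) and that the scene's color and linearized depth have already been rendered into textures on units 0 and 1. Names like focusDepth are made up for the example.

/* dof_sketch.c -- hedged sketch, not production code. */
#include <OpenGL/gl.h>   /* Mac OS X header; use <GL/gl.h> elsewhere */
#include <stdio.h>

static const char *dof_frag =
    "uniform sampler2D colorTex;                                  \n"
    "uniform sampler2D depthTex;                                  \n"
    "uniform float focusDepth;   // 0..1 depth that stays sharp   \n"
    "void main() {                                                \n"
    "    vec2 uv = gl_TexCoord[0].st;                             \n"
    "    float d = texture2D(depthTex, uv).r;                     \n"
    "    float blur = clamp(abs(d - focusDepth) * 4.0, 0.0, 1.0); \n"
    "    vec3 soft = vec3(0.0);                                   \n"
    "    for (int i = -2; i <= 2; i++)      // crude 5x5 box blur \n"
    "        for (int j = -2; j <= 2; j++)                        \n"
    "            soft += texture2D(colorTex,                      \n"
    "                uv + vec2(float(i), float(j)) * 0.004).rgb;  \n"
    "    soft /= 25.0;                                            \n"
    "    vec3 sharp = texture2D(colorTex, uv).rgb;                \n"
    "    gl_FragColor = vec4(mix(sharp, soft, blur), 1.0);        \n"
    "}                                                            \n";

GLuint build_dof_program(void)
{
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);  /* core in GL 2.0 */
    glShaderSource(fs, 1, &dof_frag, NULL);
    glCompileShader(fs);

    GLint ok = 0;
    glGetShaderiv(fs, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetShaderInfoLog(fs, sizeof log, NULL, log);
        fprintf(stderr, "DoF shader: %s\n", log);
        return 0;
    }
    GLuint prog = glCreateProgram();
    glAttachShader(prog, fs);
    glLinkProgram(prog);
    return prog;  /* bind it and draw a fullscreen quad */
}

Nothing in the sketch requires DX10-class hardware, which is the crux of the discussion that follows.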
 

Eric5h5

macrumors 68020
Dec 9, 2004
2,489
591
You don't actually need OpenGL 2.0 or DX10 to do any of those features....

--Eric
 

Irishman

macrumors 68040
Original poster
Nov 2, 2006
3,401
845
Well, then why are we only now seeing games that look this good? And what are they adding to the feature set if not NEW features?
 

shoelessone

macrumors 6502
Jul 17, 2007
347
0
You don't actually need OpenGL 2.0 or DX10 to do any of those features....

--Eric

Huh?

You realize that there is a reason they are releasing DX10, right?

I mean, sure, I guess you could "do any of those features", but DX10 makes it easier/more practical.
 

motulist

macrumors 601
Dec 2, 2003
4,235
611
Maybe I'm just an old fart, but the graphical differences between different systems these days really don't seem to make any sort of significant difference in how fun a game is or how much I want to play it. Like "Ooo, on this system you can see the sun light glistening on the waves of the water, but on this other system you can only see a general shimmer of light on the waves." Who cares? Why does that make a difference in how fun the game is to you or how much you want to play it?

I'm going to clarify what I mean by using consoles as a stand-in for generations of gameplay, which includes the computer games of their time. The technological difference from Atari to Nintendo was huge; it literally changed the type of game you could play. Same with the jump from Nintendo to Genesis, and Genesis to PS1. Each jump allowed you to play an entirely different type of game than the system before it, and at an order of magnitude better graphical experience.

But at the point where you went from PS1 to PS2, the returns on technological advances really started to diminish. PS2 games had much smoother polygon shapes than PS1, and much less slowdown in busy scenes and whatnot. But it didn't really change the type of game you could play, and it wasn't an order of magnitude better in its graphics; it just made objects look significantly smoother and more detailed.

Now with the jump from PS2 era games to PS3 era games, there is absolutely no difference in the type of games you can play and the graphics difference is pretty insignificant. Sure, things look yet again smoother and more realistic than the previous generation, but not in a way that really changes how much you can get your head into the game.

All else being equal, given a choice between slightly better graphics or slightly worse graphics, of course everyone will pick slightly better graphics. But at this point, every single system offers graphics that allow you to totally imagine the scene on your monitor is real, so differences between the systems really shouldn't matter to anyone any more.

Having lived through many generations of games, I can tell you from experience: the only thing that matters in a game is how fun it is to play. There are many older games I prefer playing over many newer games, and vice versa, there are many newer games I prefer over many older games. The graphics have nothing to do with how fun a game is to play; the only thing that matters is the gameplay itself.
 

Irishman

macrumors 68040
Original poster
Nov 2, 2006
3,401
845
Maybe I'm just an old fart, but the graphical differences between different systems these days really don't seem to make any sort of significant difference in how fun a game is or how much I want to play it. Like "Ooo, on this system you can see the sun light glistening on the waves of the water, but on this other system you can only see a general shimmer of light on the waves." Who cares? Why does that make a difference in how fun the game is to you or how much you want to play it?

I'm going to clarify what I mean by using consoles as a stand-in for generations of gameplay, which includes the computer games of their time. The technological difference from Atari to Nintendo was huge; it literally changed the type of game you could play. Same with the jump from Nintendo to Genesis, and Genesis to PS1. Each jump allowed you to play an entirely different type of game than the system before it, and at an order of magnitude better graphical experience.

But at the point where you went from PS1 to PS2, the returns on technological advances really started to diminish. PS2 games had much smoother polygon shapes than PS1, and much less slowdown in busy scenes and whatnot. But it didn't really change the type of game you could play, and it wasn't an order of magnitude better in its graphics; it just made objects look significantly smoother and more detailed.

Now with the jump from PS2 era games to PS3 era games, there is absolutely no difference in the type of games you can play and the graphics difference is pretty insignificant. Sure, things look yet again smoother and more realistic than the previous generation, but not in a way that really changes how much you can get your head into the game.

All else being equal, given a choice between slightly better graphics or slightly worse graphics, of course everyone will pick slightly better graphics. But at this point, every single system offers graphics that allow you to totally imagine the scene on your monitor is real, so differences between the systems really shouldn't matter to anyone any more.

Having lived through many generations of games, I can tell you from experience: the only thing that matters in a game is how fun it is to play. There are many older games I prefer playing over many newer games, and vice versa, there are many newer games I prefer over many older games. The graphics have nothing to do with how fun a game is to play; the only thing that matters is the gameplay itself.

For you maybe, but not for many of us. Realism, the ability to simulate reality to a degree of fidelity not possible before is VERY attractive. I remember playing Unreal when it first came out, and sure, the gameplay was there, but my jaw dropped at the realism, partially due to the graphics, and partially due to the sound. I am a BIG fan of immersion in a game, and anything that assists in getting me to that point in playing a game, I'm all for.

So, gameplay for me, is just one part of a plethora of elements that come together to make a GREAT game.

Oh, and PS, I'm an old fart too (38), and I got my gaming start with Pong on a Radio Shack console.
 

Eric5h5

macrumors 68020
Dec 9, 2004
2,489
591
Well, then why are we only now seeing games that look this good? And what are they adding to the feature set if not NEW features?

Easier/more optimized/better ways to do OLD features, I would expect. As for "why now", because the latest graphics cards are, as always, faster than the older ones. Which means you can do more effects and still get decent framerates, no matter what version of the graphics API you're using. Seriously, pretty much all of the features you listed have been done before, but probably not all at once. ;)

--Eric
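
To make that point concrete, you can ask the driver directly which of those pre-DX10 building blocks it exposes. A minimal sketch in C, assuming a current OpenGL context; the strstr test is crude (it can match one extension name inside a longer one) but fine for a demonstration.

/* caps_sketch.c -- illustrative; assumes a current OpenGL context. */
#include <OpenGL/gl.h>
#include <stdio.h>
#include <string.h>

static int has_ext(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts && strstr(exts, name) != NULL;  /* crude substring test */
}

void print_effect_building_blocks(void)
{
    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
    /* All of these predate DX10-class hardware: */
    printf("shadow maps (soft shadows):   %s\n",
           has_ext("GL_ARB_shadow") ? "yes" : "no");
    printf("GLSL shaders (most effects):  %s\n",
           has_ext("GL_ARB_shader_objects") ? "yes" : "no");
    printf("float textures (HDR, blur):   %s\n",
           has_ext("GL_ARB_texture_float") ? "yes" : "no");
}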
 

CANEHDN

macrumors 6502a
Dec 12, 2005
855
0
Eagle Mountain, UT
For you maybe, but not for many of us. Realism, the ability to simulate reality to a degree of fidelity not possible before is VERY attractive. I remember playing Unreal when it first came out, and sure, the gameplay was there, but my jaw dropped at the realism, partially due to the graphics, and partially due to the sound. I am a BIG fan of immersion in a game, and anything that assists in getting me to that point in playing a game, I'm all for.

So, gameplay for me, is just one part of a plethora of elements that come together to make a GREAT game.

Oh, and PS, I'm an old fart too (38), and I got my gaming start with Pong on a Radio Shack console.

Some nerds need that realism from computer games because their lives just don't have enough.
 

contoursvt

macrumors 6502a
Jul 22, 2005
832
0
Some nerds need that realism from computer games because their lives just don't have enough.

Right, because I see tons of realistic video games based on "driving to work" or "grocery shopping" or "a day in the office". :rolleyes: Maybe your life is pretty exciting, so your realism may include day-to-day activities such as:

-Air to air combat and dogfights
-Auto racing
-WWII battles or other war activities
-Battling mean creatures that jump out at you from the shadows

...or maybe the problem is that your machine is too low-end for serious visuals and you'd have to upgrade, and it's just easier to make fun of people who do have serious hardware and can run decent games by calling them nerds and questioning their choices.
 

Aranince

macrumors 65816
Apr 18, 2007
1,104
0
California
So, with Leopard on the horizon comes the word that it will upgrade your OpenGL from 2.0 to 2.1. Good move Apple. So, feature for feature, how well does OpenGL 2.1 fare against DirectX10?

We've all seen the videos of new DX10 games like Crysis and see the jaw-dropping difference some of these advanced features can make:

time of day lighting
real-time ambient lightmaps
dynamic soft shadows
lightbeams
long-range view distance
parallax occlusion mapping
motion blur
depth of field

So, does anyone know if the newest OpenGL can take advantage of these awesome-looking features?

And is it going to be moot, because OpenGL 3.0 is supposed to arrive about 3 months after that?

You can do any of those in DirectX9 and OpenGL. The reason why we are just now seeing these effects more is A) the hardware is coming out that can handle all of these at once, and B) DirectX10 and OpenGL 2/3 just take advantage of the more powerful hardware and the architecture behind it better than the old versions.

It will be a very long time before anyone is running Crysis in High/Ultra High... at least 2 or 3 years.
 

contoursvt

macrumors 6502a
Jul 22, 2005
832
0
You can do all of those in DX9; however, implementing them in DX9 is not as efficient, so there is a benefit to DX10. You'd need much more horsepower to get the same visuals out of DX9 than with DX10.

For example, if you took an 8800GTX and presented certain advanced visuals with DX10, it might run at 60fps (let's just pretend); running DX9 to achieve the same visuals might only get you 20fps. There was actually an article online that talked about the performance hit, and it was huge.


Also I'm willing to bet that Crysis will be able to run on high detail (minus FSAA) on a single 8800GTS or GTX card at even 1600x1200 resolutions.
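
For what it's worth, one concrete source of that kind of gap is per-draw-call overhead, which DX10 attacks directly; on the OpenGL side the same idea shows up as instancing extensions on GeForce 8-class hardware. A sketch, assuming GL_EXT_draw_instanced is exposed (entry point via <OpenGL/glext.h> or your extension loader) and a vertex shader that places each copy using gl_InstanceID; the scene numbers are hypothetical.

/* instancing_sketch.c -- illustrative, not from any shipping game. */
#include <OpenGL/gl.h>
#include <OpenGL/glext.h>

enum { ROCKS = 1000, ROCK_VERTS = 36 };   /* hypothetical scene */

void draw_rocks_dx9_style(void)
{
    /* One driver round trip per object: the CPU bottlenecks long
       before the GPU does. */
    for (int i = 0; i < ROCKS; i++) {
        /* ...update this rock's transform uniform here... */
        glDrawArrays(GL_TRIANGLES, 0, ROCK_VERTS);
    }
}

void draw_rocks_dx10_style(void)
{
    /* One call; the GPU iterates gl_InstanceID over all 1000 copies. */
    glDrawArraysInstancedEXT(GL_TRIANGLES, 0, ROCK_VERTS, ROCKS);
}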
 

Irishman

macrumors 68040
Original poster
Nov 2, 2006
3,401
845
You can do any of those in DirectX9 and OpenGL. The reason why we are just now seeing these effects more is A) the hardware is coming out that can handle all of these at once, and B) DirectX10 and OpenGL 2/3 just take advantage of the more powerful hardware and the architecture behind it better than the old versions.

It will be a very long time before anyone is running Crysis in High/Ultra High... at least 2 or 3 years.

I don't know anyone, being new and all, but do we have that many programmers here?
 

Eric5h5

macrumors 68020
Dec 9, 2004
2,489
591
I don't know anyone, being new and all, but do we have that many programmers here?

We have some programmers here, but you don't have to be a programmer to look at engines like Unreal Engine 3 (or Unity, for something Mac-specific), which do most of that stuff without DX10.

--Eric
 

contoursvt

macrumors 6502a
Jul 22, 2005
832
0
We have some programmers here, but you don't have to be a programmer to look at engines like Unreal Engine 3 (or Unity, for something Mac-specific), which do most of that stuff without DX10.

--Eric

I was under the impression that Unreal Engine 3 will have the ability to use DirectX10. Crysis is the same, with DX9 and DX10 capability, but if you take a look at some screenshots of DX9 vs. DX10 on the same games, you will see that there is quite a bit more realism and detail they are able to squeeze out. Sure, you can still play with DX9, but DX10 will definitely have the edge.
 

Irishman

macrumors 68040
Original poster
Nov 2, 2006
3,401
845
I agree with Contours. Eric seems to be making it sound like there's no real reason for DX10 to exist other than to make Microsoft and graphics card makers more money.

I thought I made it clear that I am talking about the combination of ALL those features together, taken to a whole new level of realism and immersion.

THAT is new.
 

fblack

macrumors 6502a
May 16, 2006
528
1
USA
For you maybe, but not for many of us. Realism, the ability to simulate reality to a degree of fidelity not possible before is VERY attractive. I remember playing Unreal when it first came out, and sure, the gameplay was there, but my jaw dropped at the realism, partially due to the graphics, and partially due to the sound. I am a BIG fan of immersion in a game, and anything that assists in getting me to that point in playing a game, I'm all for.

So, gameplay for me, is just one part of a plethora of elements that come together to make a GREAT game.

Oh, and PS, I'm an old fart too (38), and I got my gaming start with Pong on a Radio Shack console.

Yea, I remember my jaw dropping when Unreal first came out, and it was a loong way from playing Pong on my Telstar. But I also remember that after the initial wow faded, my head screamed in anguish at the sheer drudgery of the game. This is in contrast to the sheer addiction of playing Master of Orion 2 ("one more turn"), which didn't have great graphics, or the immersive quality of Call of Duty, where I had to peel my face off the screen when I stopped. Did you ever play Fallout? The graphics were OK, but the storyline and multiple endings rocked! So I tend to agree with motulist: fun is more important than the eye candy. Don't get me wrong, you still need graphics, and I'm not advocating going back and playing Zork, but story and gameplay are the major elements that make a game worth playing.:)
 

killmoms

macrumors 68040
Jun 23, 2003
3,752
55
Durham, NC
You can do all of those in DX9; however, implementing them in DX9 is not as efficient, so there is a benefit to DX10. You'd need much more horsepower to get the same visuals out of DX9 than with DX10.

For example, if you took an 8800GTX and presented certain advanced visuals with DX10, it might run at 60fps (let's just pretend); running DX9 to achieve the same visuals might only get you 20fps. There was actually an article online that talked about the performance hit, and it was huge.

That doesn't have to do with DX9 vs. DX10 themselves, though. It has to do with the minimum Shader Model level specified by DX9 vs. DX10. AKA, what it's REALLY talking about is the capabilities of the hardware as defined in the DirectX standard. That's why people are saying the same things are possible in OpenGL, if you're using the same level of hardware.
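
The same point can be seen from code: in OpenGL those hardware levels surface not as a named Shader Model but as queryable limits. A sketch, assuming a current OpenGL 2.0 context:

/* hw_level_sketch.c -- illustrative; assumes a current GL 2.0 context. */
#include <OpenGL/gl.h>
#include <stdio.h>

void print_shader_hw_level(void)
{
    GLint vtex = 0, varyings = 0;
    printf("GLSL version: %s\n",
           (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION));
    /* SM3-class GPUs can fetch textures in the vertex stage;
       SM2-class parts report 0 units here. */
    glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, &vtex);
    glGetIntegerv(GL_MAX_VARYING_FLOATS, &varyings);
    printf("vertex texture image units: %d\n", vtex);
    printf("max varying floats:         %d\n", varyings);
}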
 

Eric5h5

macrumors 68020
Dec 9, 2004
2,489
591
I agree with Contours. Eric seems to be making it sound like there's no real reason for DX10 to exist other than to make Microsoft and graphics card makers more money.

Er, no. In your first post, you listed a number of features that already exist without DX10, and asked when OpenGL would be able to do that stuff. I and others told you that it already can. The main reason for advancing a 3D graphics API at this point is to make it easier/faster to do fancy effects, and to take advantage of hardware features that also make it easier/faster to do effects. Direct3D and OpenGL will continue to evolve in that direction, and in no way did I imply that it was unnecessary. There just seemed to be some misapprehension that DX10 was providing some new effects that had never been seen before.

--Eric
 

Irishman

macrumors 68040
Original poster
Nov 2, 2006
3,401
845
Er, no. In your first post, you listed a number of features that already exist without DX10, and asked when OpenGL would be able to do that stuff. I and others told you that it already can. The main reason for advancing a 3D graphics API at this point is to make it easier/faster to do fancy effects, and to take advantage of hardware features that also make it easier/faster to do effects. Direct3D and OpenGL will continue to evolve in that direction, and in no way did I imply that it was unnecessary. There just seemed to be some misapprehension that DX10 was providing some new effects that had never been seen before.

--Eric

Then why not just say what you JUST said at the outset? It would have avoided confusion about what you really meant. The prior comment just sounded dismissive.
 

icecone

macrumors regular
Jun 8, 2007
168
0
Er, no. In your first post, you listed a number of features that already exist without DX10, and asked when OpenGL would be able to do that stuff. I and others told you that it already can. The main reason for advancing a 3D graphics API at this point is to make it easier/faster to do fancy effects, and to take advantage of hardware features that also make it easier/faster to do effects. Direct3D and OpenGL will continue to evolve in that direction, and in no way did I imply that it was unnecessary. There just seemed to be some misapprehension that DX10 was providing some new effects that had never been seen before.

--Eric
You are right, but the fact is that for the same game, DX10 provides better graphics. (You can do the same things with DX9, but it's easier for developers to write the code in DX10.)
 

Krevnik

macrumors 601
Sep 8, 2003
4,100
1,309
So, with Leopard on the horizon comes the word that it will upgrade your OpenGL from 2.0 to 2.1. Good move Apple. So, feature for feature, how well does OpenGL 2.1 fare against DirectX10?

Leopard's OGL implementation isn't going to provide the framework to do that like DX10 does, but as mentioned, you don't /need/ DX10 or OGL3 to do it; they just make it easier.

The /big/ thing for OGL on Leopard is the new implementation in bytecode. While running OpenGL in a virtual machine like Java (using the Low Level Virtual Machine, LLVM) doesn't sound like it would improve things... it greatly lowers CPU usage by games. WoW went from 125% CPU to 25% CPU usage because of the changes. It leaves a lot more CPU for developers to leverage for other advanced features.
 

Siemova

macrumors member
Dec 2, 2002
43
0
Texas
Isn't 2.1 something like a year old already? I know Apple announced back then that they'd support it, but now that 3.0 is close I'd hope they would either include it instead or provide it via Software Update soon.

Wikipedia's article on 3.0 was an interesting read. Basically, yeah, it just makes it easier and less system-intensive to do what's already possible (although it also lays the groundwork for future improvements). Personally, I think that in itself is a pretty nice advancement since, in addition to giving us better performance, it'll help attract game developers and make development itself easier.
 

rbarris

macrumors 6502
Oct 28, 2003
358
0
Irvine CA
Leopard's OGL implementation isn't going to provide the framework to do that like DX10 does, but as mentioned, you don't /need/ DX10 or OGL3 to do it; they just make it easier.

The /big/ thing for OGL on Leopard is the new implementation in bytecode. While running OpenGL in a virtual machine like Java (using the Low Level Virtual Machine, LLVM) doesn't sound like it would improve things... it greatly lowers CPU usage by games. WoW went from 125% CPU to 25% CPU usage because of the changes. It leaves a lot more CPU for developers to leverage for other advanced features.

This isn't quite on target. The new Leopard OpenGL includes a technology known as LLVM, which is a dynamic code generator, or JIT, and is invoked (as far as I can tell) if the application gives GL a shader that cannot run directly on the hardware of the GPU; an example would be vertex shaders on Intel GMA graphics type systems.

Pre-Leopard OpenGL already did a form of dynamic code generation for those cases, but LLVM does it better.

It isn't accurate to say that the new GL is based on bytecode or runs in some kind of virtual machine. It's a large C library.

A goal for a developer writing a high-performing app is to stay off as many of the paths that might invoke the dynamic code generator as possible, because all those paths lead to cycles being spent on the CPU instead of the GPU. In the case of the GMA 950 it is unavoidable (no VS HW; it can only do the pixel shaders), but LLVM will do a better job there than the old code in Tiger.

A good app running on a discrete GPU such as ATI or NVIDIA parts should never have to invoke any LLVM-generated code if it's set up right.

BTW, this change of adding LLVM to GL in Leopard really had no connection with the speedups on WoW starting in Intel Mac GL in 10.4.6 and continuing through 10.4.9. Those came from other factors (new GL extensions and multi-threaded driver).
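
For anyone who wants to check from code whether a renderer where such fallbacks loom is in play, Mac OS X can at least report whether a renderer is hardware accelerated at all, via the CGL renderer-info API. A sketch follows; note it cannot see per-feature fallbacks (like vertex shaders on the GMA 950), which is exactly why knowing the target hardware matters.

/* accel_sketch.c -- illustrative; link against the OpenGL framework. */
#include <OpenGL/OpenGL.h>
#include <stdio.h>

int any_accelerated_renderer(void)
{
    CGLRendererInfoObj info;
    GLint nrend = 0, found = 0;
    /* 0xFFFFFFFF = consider renderers for all displays; fine for a
       diagnostic. */
    if (CGLQueryRendererInfo(0xFFFFFFFF, &info, &nrend) != kCGLNoError)
        return -1;
    for (GLint i = 0; i < nrend; i++) {
        GLint accel = 0;
        CGLDescribeRenderer(info, i, kCGLRPAccelerated, &accel);
        if (accel)
            found = 1;
    }
    CGLDestroyRendererInfo(info);
    return found;
}

int main(void)
{
    printf("hardware accelerated renderer available: %d\n",
           any_accelerated_renderer());
    return 0;
}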
 